InfiniteBoost: building infinite ensembles with gradient descent
Authors
Alex Rogozhnikov, Tatiana Likhomanenko
Abstract
In machine learning, ensemble methods have demonstrated high accuracy on a wide variety of problems across different areas. The best-known algorithms intensively used in practice are random forests and gradient boosting. In this paper we present InfiniteBoost, a novel algorithm that combines the best properties of these two approaches. The algorithm constructs an ensemble of trees for which two properties hold: trees of the ensemble incorporate the mistakes made by the others, and at the same time the ensemble can contain an infinite number of trees without any over-fitting effect. The proposed algorithm is evaluated on regression, classification, and ranking tasks using large-scale, publicly available datasets.
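To make the abstract's two properties concrete: both hold if the ensemble prediction is maintained as a fixed capacity c times the running average of all trees, while each new tree is fit to the current ensemble's mistakes. Below is a minimal sketch of that idea in Python for squared loss; the fixed capacity value, the tree depth, and the name infiniteboost_sketch are illustrative assumptions (the paper also adapts the capacity, which this sketch omits), so treat it as a reading aid rather than the authors' exact procedure.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def infiniteboost_sketch(X, y, n_trees=1000, capacity=4.0, max_depth=3, seed=0):
    rng = np.random.RandomState(seed)
    pred = np.zeros(len(y))  # current ensemble prediction F(x) on the training set
    trees = []
    for t in range(1, n_trees + 1):
        # Fit the next tree to the current mistakes: for squared loss the
        # negative gradient is simply the residual y - F(x).
        residual = y - pred
        tree = DecisionTreeRegressor(max_depth=max_depth,
                                     random_state=rng.randint(2**31 - 1))
        tree.fit(X, residual)
        trees.append(tree)
        # Running-average update: F <- F + (capacity * tree(x) - F) / t keeps
        # F equal to capacity times the mean of all trees built so far, so the
        # prediction stays bounded no matter how many trees are added.
        pred += (capacity * tree.predict(X) - pred) / t

    def predict(X_new):
        # The ensemble is capacity times the plain average of its trees, so,
        # as in a random forest, adding more trees only reduces variance.
        return capacity * np.mean([tr.predict(X_new) for tr in trees], axis=0)

    return predict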
Similar resources
Stochastic Particle Gradient Descent for Infinite Ensembles
The superior performance of ensemble methods with infinite models is well known. Most of these methods are based on optimization problems in infinite-dimensional spaces with some regularization; for instance, boosting methods and convex neural networks use L1-regularization with a non-negativity constraint. However, due to the difficulty of handling L1-regularization, these problems require ea...
Forest Fire Danger Projections in the Mediterranean using ENSEMBLES Regional Climate Change Scenarios
We present future fire danger scenarios for the countries bordering the Mediterranean areas of Europe and North Africa, building on a multi-model ensemble of state-of-the-art regional climate projections from the EU-funded project ENSEMBLES. Fire danger is estimated using the Canadian Forest Fire Weather Index (FWI) System and a related set of indices. To overcome some of the limitations of ENSE...
Efficient Variational Bayesian Neural Network Ensembles for Outlier Detection
In this work we perform outlier detection using ensembles of neural networks obtained by variational approximation of the posterior in a Bayesian neural network setting. The variational parameters are obtained by sampling from the true posterior by gradient descent. We show our outlier detection results are comparable to those obtained using other efficient ensembling methods.
Extensions of the Hestenes-Stiefel and Polak-Ribiere-Polyak conjugate gradient methods with sufficient descent property
Using search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are done on a set of CUTEr unconstrained opti...
Diversity creation in local search for the evolution of neural network ensembles
The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL, finding that NCL is not an essential component of the algorithm, while implicit fitness sharing is. Furth...
Journal: CoRR
Volume: abs/1706.01109
Pages: -
Publication year: 2017